Sparse Reduced Rank Regression With Nonconvex Regularization
Authors
Abstract
In this paper, the estimation problem for the sparse reduced rank regression (SRRR) model is considered. The SRRR model is widely used for dimension reduction and variable selection, with applications in signal processing, econometrics, etc. The problem is formulated as minimizing the least squares loss with a sparsity-inducing penalty under an orthogonality constraint. Convex sparsity-inducing functions have been used for SRRR in the literature. In this work, a nonconvex function is proposed to better induce sparsity. An efficient algorithm based on the alternating minimization (or projection) method is developed to solve the resulting nonconvex optimization problem. Numerical simulations show that the proposed algorithm is much more efficient than the benchmark methods and that the nonconvex function yields better estimation accuracy.

Index Terms: Multivariate regression, low-rank, variable selection, factor analysis, nonconvex optimization.
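The alternating scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the loss is the least squares fit of Y ≈ X A Bᵀ, the A-step is a proximal gradient step with a nonconvex penalty (MCP is used here purely as a representative nonconvex sparsity-inducing function; the paper's specific penalty is not given in the abstract), and the B-step enforces the orthogonality constraint BᵀB = I via the closed-form orthogonal Procrustes solution. All function names (`mcp_prox`, `srrr_alt_min`) and parameter defaults are illustrative assumptions.

```python
import numpy as np

def mcp_prox(z, lam, step, gamma=3.0):
    """Elementwise proximal operator of step * MCP(., lam, gamma).

    MCP is one common nonconvex sparsity-inducing penalty (illustrative
    choice here); the formula requires gamma > step.
    """
    out = z.copy()
    region = np.abs(z) <= gamma * lam
    soft = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    out[region] = soft[region] / (1.0 - step / gamma)
    return out

def srrr_alt_min(X, Y, r, lam=0.1, n_iter=100):
    """Alternating minimization sketch for
        min_{A,B} (1/2)||Y - X A B^T||_F^2 + penalty(A)  s.t.  B^T B = I_r.
    """
    n, p = X.shape
    q = Y.shape[1]
    rng = np.random.default_rng(0)
    A = 0.01 * rng.standard_normal((p, r))
    B, _ = np.linalg.qr(rng.standard_normal((q, r)))
    # 1/L step size, L = spectral-norm bound on the A-gradient's Lipschitz constant
    step = 1.0 / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        # A-step: proximal gradient step on the smooth part (B fixed, B^T B = I,
        # so the gradient simplifies to X^T X A - X^T Y B)
        grad = X.T @ (X @ A - Y @ B)
        A = mcp_prox(A - step * grad, lam, step)
        # B-step: orthogonal Procrustes problem, solved in closed form via SVD
        U, _, Vt = np.linalg.svd(Y.T @ (X @ A), full_matrices=False)
        B = U @ Vt
    return A, B
```

The B-step keeps the iterate exactly on the constraint set (BᵀB = I by construction of the SVD factors), which is what makes alternating between the two subproblems attractive here: each subproblem has a cheap, well-understood solution even though the joint problem is nonconvex.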
Related works
Optimal Computational and Statistical Rates of Convergence for Sparse Nonconvex Learning Problems.
We provide theoretical analysis of the statistical and computational properties of penalized M-estimators that can be formulated as the solution to a possibly nonconvex optimization problem. Many important estimators fall in this category, including least squares regression with nonconvex regularization, generalized linear models with nonconvex regularization and sparse elliptical random design...
Support recovery without incoherence: A case for nonconvex regularization
We develop a new primal-dual witness proof framework that may be used to establish variable selection consistency and ℓ∞-bounds for sparse regression problems, even when the loss function and regularizer are nonconvex. We use this method to prove two theorems concerning support recovery and ℓ∞-guarantees for a regression estimator in a general setting. Notably, our theory applies to all potential...
Nonconvex Sparse Logistic Regression with Weakly Convex Regularization
In this work we propose to fit a sparse logistic regression model by a weakly convex regularized nonconvex optimization problem. The idea is based on the finding that a weakly convex function as an approximation of the ℓ0 pseudo-norm is able to better induce sparsity than the commonly used ℓ1 norm. For a class of weakly convex sparsity-inducing functions, we prove the nonconvexity of the corres...
Scalable Reduced-rank and Sparse Regression
Reduced-rank regression, i.e., multi-task regression subject to a low-rank constraint, is an effective approach to reduce the number of observations required for estimation consistency. However, it is still possible for the estimated singular vectors to be inconsistent in high dimensions as the number of predictors goes to infinity at a faster rate than the number of available observations. Spa...
Bayesian sparse reduced rank multivariate regression
Many modern statistical problems can be cast in the framework of multivariate regression, where the main task is to make statistical inference for a possibly sparse and low-rank coefficient matrix. The low-rank structure in the coefficient matrix is of intrinsic multivariate nature, which, when combined with sparsity, can further lift dimension reduction, conduct variable selection, and facilit...
Publication year: 2018